[ Wed Sep 28 02:17:36 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:17:50 2022 ] Parameters:
{'work_dir': 'work_dir/ntu60/csub/fc_bone_vel',
 'model_saved_name': 'work_dir/ntu60/csub/fc_bone_vel/runs',
 'config': 'config/nturgbd-cross-subject/fc_bone_vel.yaml',
 'phase': 'train',
 'save_score': False,
 'joint_label': [],
 'seed': 1,
 'log_interval': 100,
 'save_interval': 1,
 'save_epoch': 35,
 'eval_interval': 5,
 'ema': False,
 'print_log': True,
 'show_topk': [1, 5],
 'feeder': 'feeders.feeder_ntu.Feeder',
 'num_worker': 48,
 'train_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'train', 'debug': False, 'random_choose': False, 'random_shift': False, 'random_move': False, 'window_size': 64, 'normalization': False, 'random_rot': True, 'p_interval': [0.5, 1], 'vel': True, 'bone': True},
 'test_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'test', 'window_size': 64, 'p_interval': [0.95], 'vel': True, 'bone': True, 'debug': False},
 'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model',
 'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2},
 'weights': None,
 'ignore_weights': [],
 'base_lr': 0.1,
 'step': [90, 100],
 'device': [3],
 'optimizer': 'SGD',
 'nesterov': True,
 'momentum': 0.9,
 'batch_size': 64,
 'test_batch_size': 64,
 'start_epoch': 0,
 'num_epoch': 110,
 'weight_decay': 0.0004,
 'lr_decay_rate': 0.1,
 'warm_up_epoch': 5}

[ Wed Sep 28 02:17:50 2022 ] # Parameters: 2082097
[ Wed Sep 28 02:17:50 2022 ] Training epoch: 1
[ Wed Sep 28 02:21:01 2022 ] 	Mean training loss: 2.7909. loss2: 0.0000. Mean training acc: 26.16%.
[ Wed Sep 28 02:21:01 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:21:01 2022 ] Eval epoch: 1
[ Wed Sep 28 02:21:30 2022 ] 	Mean test loss of 258 batches: 1.8860546267309855.
[ Wed Sep 28 02:21:31 2022 ] 	Top1: 46.05%
[ Wed Sep 28 02:21:31 2022 ] 	Top5: 81.36%
[ Wed Sep 28 02:21:31 2022 ] Training epoch: 2
[ Wed Sep 28 02:24:38 2022 ] 	Mean training loss: 1.6526. loss2: 0.0000. Mean training acc: 50.44%.
[ Wed Sep 28 02:24:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:24:38 2022 ] Eval epoch: 2
[ Wed Sep 28 02:25:07 2022 ] 	Mean test loss of 258 batches: 1.3085940678914387.
[ Wed Sep 28 02:25:07 2022 ] 	Top1: 59.55%
[ Wed Sep 28 02:25:07 2022 ] 	Top5: 90.35%
[ Wed Sep 28 02:25:07 2022 ] Training epoch: 3
[ Wed Sep 28 02:28:15 2022 ] 	Mean training loss: 1.2722. loss2: 0.0000. Mean training acc: 61.32%.
[ Wed Sep 28 02:28:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:28:15 2022 ] Eval epoch: 3
[ Wed Sep 28 02:28:44 2022 ] 	Mean test loss of 258 batches: 1.2221124495646751.
[ Wed Sep 28 02:28:44 2022 ] 	Top1: 63.23%
[ Wed Sep 28 02:28:44 2022 ] 	Top5: 91.42%
[ Wed Sep 28 02:28:44 2022 ] Training epoch: 4
[ Wed Sep 28 02:31:51 2022 ] 	Mean training loss: 1.0830. loss2: 0.0000. Mean training acc: 66.11%.
[ Wed Sep 28 02:31:51 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:31:51 2022 ] Eval epoch: 4
[ Wed Sep 28 02:32:20 2022 ] 	Mean test loss of 258 batches: 1.1832145785176478.
[ Wed Sep 28 02:32:20 2022 ] 	Top1: 64.65%
[ Wed Sep 28 02:32:20 2022 ] 	Top5: 91.61%
[ Wed Sep 28 02:32:20 2022 ] Training epoch: 5
[ Wed Sep 28 02:35:27 2022 ] 	Mean training loss: 0.9801. loss2: 0.0000. Mean training acc: 69.43%.
[ Wed Sep 28 02:35:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:35:27 2022 ] Eval epoch: 5
[ Wed Sep 28 02:35:57 2022 ] 	Mean test loss of 258 batches: 1.3576540658178256.
[ Wed Sep 28 02:35:57 2022 ] 	Top1: 62.45%
[ Wed Sep 28 02:35:57 2022 ] 	Top5: 90.53%
[ Wed Sep 28 02:35:57 2022 ] Training epoch: 6
[ Wed Sep 28 02:39:04 2022 ] 	Mean training loss: 0.8705. loss2: 0.0000. Mean training acc: 72.40%.
[ Wed Sep 28 02:39:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:39:04 2022 ] Eval epoch: 6
[ Wed Sep 28 02:39:33 2022 ] 	Mean test loss of 258 batches: 1.1503793012726216.
[ Wed Sep 28 02:39:33 2022 ] 	Top1: 64.86%
[ Wed Sep 28 02:39:33 2022 ] 	Top5: 92.56%
[ Wed Sep 28 02:39:33 2022 ] Training epoch: 7
[ Wed Sep 28 02:42:41 2022 ] 	Mean training loss: 0.8203. loss2: 0.0000. Mean training acc: 74.32%.
[ Wed Sep 28 02:42:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:42:41 2022 ] Eval epoch: 7
[ Wed Sep 28 02:43:10 2022 ] 	Mean test loss of 258 batches: 1.0441635065069494.
[ Wed Sep 28 02:43:10 2022 ] 	Top1: 69.13%
[ Wed Sep 28 02:43:10 2022 ] 	Top5: 93.04%
[ Wed Sep 28 02:43:10 2022 ] Training epoch: 8
[ Wed Sep 28 02:46:18 2022 ] 	Mean training loss: 0.7818. loss2: 0.0000. Mean training acc: 75.25%.
[ Wed Sep 28 02:46:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:46:18 2022 ] Eval epoch: 8
[ Wed Sep 28 02:46:47 2022 ] 	Mean test loss of 258 batches: 0.921666455476783.
[ Wed Sep 28 02:46:47 2022 ] 	Top1: 72.55%
[ Wed Sep 28 02:46:47 2022 ] 	Top5: 94.05%
[ Wed Sep 28 02:46:47 2022 ] Training epoch: 9
[ Wed Sep 28 02:49:54 2022 ] 	Mean training loss: 0.7430. loss2: 0.0000. Mean training acc: 76.60%.
[ Wed Sep 28 02:49:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:49:54 2022 ] Eval epoch: 9
[ Wed Sep 28 02:50:23 2022 ] 	Mean test loss of 258 batches: 1.1126988897489947.
[ Wed Sep 28 02:50:23 2022 ] 	Top1: 68.80%
[ Wed Sep 28 02:50:23 2022 ] 	Top5: 91.64%
[ Wed Sep 28 02:50:23 2022 ] Training epoch: 10
[ Wed Sep 28 02:53:31 2022 ] 	Mean training loss: 0.7315. loss2: 0.0000. Mean training acc: 77.11%.
[ Wed Sep 28 02:53:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:53:31 2022 ] Eval epoch: 10
[ Wed Sep 28 02:54:00 2022 ] 	Mean test loss of 258 batches: 0.8749126616374466.
[ Wed Sep 28 02:54:00 2022 ] 	Top1: 73.20%
[ Wed Sep 28 02:54:00 2022 ] 	Top5: 94.69%
[ Wed Sep 28 02:54:00 2022 ] Training epoch: 11
[ Wed Sep 28 02:57:08 2022 ] 	Mean training loss: 0.7091. loss2: 0.0000. Mean training acc: 77.55%.
[ Wed Sep 28 02:57:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:57:08 2022 ] Eval epoch: 11
[ Wed Sep 28 02:57:37 2022 ] 	Mean test loss of 258 batches: 0.8169362198474796.
[ Wed Sep 28 02:57:37 2022 ] 	Top1: 75.59%
[ Wed Sep 28 02:57:37 2022 ] 	Top5: 95.23%
[ Wed Sep 28 02:57:37 2022 ] Training epoch: 12
[ Wed Sep 28 03:00:44 2022 ] 	Mean training loss: 0.7001. loss2: 0.0000. Mean training acc: 77.89%.
[ Wed Sep 28 03:00:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:00:44 2022 ] Eval epoch: 12
[ Wed Sep 28 03:01:13 2022 ] 	Mean test loss of 258 batches: 0.8825628332389418.
[ Wed Sep 28 03:01:13 2022 ] 	Top1: 73.56%
[ Wed Sep 28 03:01:14 2022 ] 	Top5: 94.54%
[ Wed Sep 28 03:01:14 2022 ] Training epoch: 13
[ Wed Sep 28 03:04:22 2022 ] 	Mean training loss: 0.6836. loss2: 0.0000. Mean training acc: 78.46%.
[ Wed Sep 28 03:04:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:04:22 2022 ] Eval epoch: 13
[ Wed Sep 28 03:04:51 2022 ] 	Mean test loss of 258 batches: 0.9174293269251668.
[ Wed Sep 28 03:04:51 2022 ] 	Top1: 72.16%
[ Wed Sep 28 03:04:51 2022 ] 	Top5: 94.30%
[ Wed Sep 28 03:04:51 2022 ] Training epoch: 14
[ Wed Sep 28 03:07:59 2022 ] 	Mean training loss: 0.6709. loss2: 0.0000. Mean training acc: 78.80%.
[ Wed Sep 28 03:07:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:07:59 2022 ] Eval epoch: 14
[ Wed Sep 28 03:08:28 2022 ] 	Mean test loss of 258 batches: 0.9741951023885446.
[ Wed Sep 28 03:08:28 2022 ] 	Top1: 71.00%
[ Wed Sep 28 03:08:28 2022 ] 	Top5: 94.60%
[ Wed Sep 28 03:08:28 2022 ] Training epoch: 15
[ Wed Sep 28 03:11:35 2022 ] 	Mean training loss: 0.6583. loss2: 0.0000. Mean training acc: 79.17%.
[ Wed Sep 28 03:11:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:11:35 2022 ] Eval epoch: 15
[ Wed Sep 28 03:12:04 2022 ] 	Mean test loss of 258 batches: 1.0733356270217156.
[ Wed Sep 28 03:12:05 2022 ] 	Top1: 70.21%
[ Wed Sep 28 03:12:05 2022 ] 	Top5: 92.84%
[ Wed Sep 28 03:12:05 2022 ] Training epoch: 16
[ Wed Sep 28 03:15:12 2022 ] 	Mean training loss: 0.6374. loss2: 0.0000. Mean training acc: 79.85%.
[ Wed Sep 28 03:15:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:15:12 2022 ] Eval epoch: 16
[ Wed Sep 28 03:15:41 2022 ] 	Mean test loss of 258 batches: 0.8571511920570403.
[ Wed Sep 28 03:15:41 2022 ] 	Top1: 73.29%
[ Wed Sep 28 03:15:41 2022 ] 	Top5: 95.06%
[ Wed Sep 28 03:15:41 2022 ] Training epoch: 17
[ Wed Sep 28 03:18:49 2022 ] 	Mean training loss: 0.6294. loss2: 0.0000. Mean training acc: 80.02%.
[ Wed Sep 28 03:18:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:18:49 2022 ] Eval epoch: 17
[ Wed Sep 28 03:19:18 2022 ] 	Mean test loss of 258 batches: 0.9555923341549644.
[ Wed Sep 28 03:19:18 2022 ] 	Top1: 72.63%
[ Wed Sep 28 03:19:18 2022 ] 	Top5: 94.67%
[ Wed Sep 28 03:19:18 2022 ] Training epoch: 18
[ Wed Sep 28 03:22:26 2022 ] 	Mean training loss: 0.6229. loss2: 0.0000. Mean training acc: 80.24%.
[ Wed Sep 28 03:22:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:22:26 2022 ] Eval epoch: 18
[ Wed Sep 28 03:22:55 2022 ] 	Mean test loss of 258 batches: 0.7374700748527697.
[ Wed Sep 28 03:22:55 2022 ] 	Top1: 76.99%
[ Wed Sep 28 03:22:55 2022 ] 	Top5: 95.83%
[ Wed Sep 28 03:22:55 2022 ] Training epoch: 19
[ Wed Sep 28 03:26:03 2022 ] 	Mean training loss: 0.6206. loss2: 0.0000. Mean training acc: 80.49%.
[ Wed Sep 28 03:26:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:26:03 2022 ] Eval epoch: 19
[ Wed Sep 28 03:26:32 2022 ] 	Mean test loss of 258 batches: 0.8249692661586658.
[ Wed Sep 28 03:26:32 2022 ] 	Top1: 76.44%
[ Wed Sep 28 03:26:32 2022 ] 	Top5: 94.85%
[ Wed Sep 28 03:26:32 2022 ] Training epoch: 20
[ Wed Sep 28 03:29:39 2022 ] 	Mean training loss: 0.6094. loss2: 0.0000. Mean training acc: 80.75%.
[ Wed Sep 28 03:29:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:29:39 2022 ] Eval epoch: 20
[ Wed Sep 28 03:30:09 2022 ] 	Mean test loss of 258 batches: 0.7717191666480183.
[ Wed Sep 28 03:30:09 2022 ] 	Top1: 76.41%
[ Wed Sep 28 03:30:09 2022 ] 	Top5: 95.57%
[ Wed Sep 28 03:30:09 2022 ] Training epoch: 21
[ Wed Sep 28 03:33:17 2022 ] 	Mean training loss: 0.6037. loss2: 0.0000. Mean training acc: 81.02%.
[ Wed Sep 28 03:33:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:33:17 2022 ] Eval epoch: 21
[ Wed Sep 28 03:33:46 2022 ] 	Mean test loss of 258 batches: 0.8022673135811045.
[ Wed Sep 28 03:33:46 2022 ] 	Top1: 77.02%
[ Wed Sep 28 03:33:46 2022 ] 	Top5: 95.14%
[ Wed Sep 28 03:33:46 2022 ] Training epoch: 22
[ Wed Sep 28 03:36:54 2022 ] 	Mean training loss: 0.6005. loss2: 0.0000. Mean training acc: 81.03%.
[ Wed Sep 28 03:36:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:36:54 2022 ] Eval epoch: 22
[ Wed Sep 28 03:37:23 2022 ] 	Mean test loss of 258 batches: 0.7511092385118322.
[ Wed Sep 28 03:37:23 2022 ] 	Top1: 77.82%
[ Wed Sep 28 03:37:23 2022 ] 	Top5: 96.01%
[ Wed Sep 28 03:37:23 2022 ] Training epoch: 23
[ Wed Sep 28 03:40:31 2022 ] 	Mean training loss: 0.5993. loss2: 0.0000. Mean training acc: 81.25%.
[ Wed Sep 28 03:40:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:40:31 2022 ] Eval epoch: 23
[ Wed Sep 28 03:41:00 2022 ] 	Mean test loss of 258 batches: 0.8765650869801987.
[ Wed Sep 28 03:41:00 2022 ] 	Top1: 74.84%
[ Wed Sep 28 03:41:00 2022 ] 	Top5: 94.96%
[ Wed Sep 28 03:41:00 2022 ] Training epoch: 24
[ Wed Sep 28 03:44:08 2022 ] 	Mean training loss: 0.5848. loss2: 0.0000. Mean training acc: 81.51%.
[ Wed Sep 28 03:44:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:44:08 2022 ] Eval epoch: 24
[ Wed Sep 28 03:44:37 2022 ] 	Mean test loss of 258 batches: 0.9137649703626485.
[ Wed Sep 28 03:44:37 2022 ] 	Top1: 73.74%
[ Wed Sep 28 03:44:37 2022 ] 	Top5: 93.85%
[ Wed Sep 28 03:44:37 2022 ] Training epoch: 25
[ Wed Sep 28 03:47:45 2022 ] 	Mean training loss: 0.5836. loss2: 0.0000. Mean training acc: 81.69%.
[ Wed Sep 28 03:47:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:47:45 2022 ] Eval epoch: 25
[ Wed Sep 28 03:48:14 2022 ] 	Mean test loss of 258 batches: 0.7148413542051648.
[ Wed Sep 28 03:48:14 2022 ] 	Top1: 78.60%
[ Wed Sep 28 03:48:14 2022 ] 	Top5: 95.44%
[ Wed Sep 28 03:48:14 2022 ] Training epoch: 26
[ Wed Sep 28 03:51:22 2022 ] 	Mean training loss: 0.5858. loss2: 0.0000. Mean training acc: 81.38%.
[ Wed Sep 28 03:51:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:51:22 2022 ] Eval epoch: 26
[ Wed Sep 28 03:51:51 2022 ] 	Mean test loss of 258 batches: 0.7243685513272766.
[ Wed Sep 28 03:51:51 2022 ] 	Top1: 78.07%
[ Wed Sep 28 03:51:51 2022 ] 	Top5: 96.17%
[ Wed Sep 28 03:51:51 2022 ] Training epoch: 27
[ Wed Sep 28 03:54:59 2022 ] 	Mean training loss: 0.5862. loss2: 0.0000. Mean training acc: 81.32%.
[ Wed Sep 28 03:54:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:54:59 2022 ] Eval epoch: 27
[ Wed Sep 28 03:55:28 2022 ] 	Mean test loss of 258 batches: 0.8009421847933946.
[ Wed Sep 28 03:55:28 2022 ] 	Top1: 75.68%
[ Wed Sep 28 03:55:28 2022 ] 	Top5: 95.60%
[ Wed Sep 28 03:55:28 2022 ] Training epoch: 28
[ Wed Sep 28 03:58:36 2022 ] 	Mean training loss: 0.5733. loss2: 0.0000. Mean training acc: 81.88%.
[ Wed Sep 28 03:58:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:58:36 2022 ] Eval epoch: 28
[ Wed Sep 28 03:59:05 2022 ] 	Mean test loss of 258 batches: 0.7401831271921018.
[ Wed Sep 28 03:59:05 2022 ] 	Top1: 77.51%
[ Wed Sep 28 03:59:05 2022 ] 	Top5: 95.85%
[ Wed Sep 28 03:59:05 2022 ] Training epoch: 29
[ Wed Sep 28 04:02:12 2022 ] 	Mean training loss: 0.5706. loss2: 0.0000. Mean training acc: 81.86%.
[ Wed Sep 28 04:02:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:02:12 2022 ] Eval epoch: 29
[ Wed Sep 28 04:02:42 2022 ] 	Mean test loss of 258 batches: 0.8281250408915586.
[ Wed Sep 28 04:02:42 2022 ] 	Top1: 74.30%
[ Wed Sep 28 04:02:42 2022 ] 	Top5: 95.89%
[ Wed Sep 28 04:02:42 2022 ] Training epoch: 30
[ Wed Sep 28 04:05:49 2022 ] 	Mean training loss: 0.5688. loss2: 0.0000. Mean training acc: 82.05%.
[ Wed Sep 28 04:05:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:05:49 2022 ] Eval epoch: 30
[ Wed Sep 28 04:06:19 2022 ] 	Mean test loss of 258 batches: 1.0159890384868133.
[ Wed Sep 28 04:06:19 2022 ] 	Top1: 71.13%
[ Wed Sep 28 04:06:19 2022 ] 	Top5: 94.19%
[ Wed Sep 28 04:06:19 2022 ] Training epoch: 31
[ Wed Sep 28 04:09:26 2022 ] 	Mean training loss: 0.5725. loss2: 0.0000. Mean training acc: 81.77%.
[ Wed Sep 28 04:09:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:09:26 2022 ] Eval epoch: 31
[ Wed Sep 28 04:09:55 2022 ] 	Mean test loss of 258 batches: 0.839599169155424.
[ Wed Sep 28 04:09:56 2022 ] 	Top1: 75.37%
[ Wed Sep 28 04:09:56 2022 ] 	Top5: 95.37%
[ Wed Sep 28 04:09:56 2022 ] Training epoch: 32
[ Wed Sep 28 04:13:04 2022 ] 	Mean training loss: 0.5649. loss2: 0.0000. Mean training acc: 81.96%.
[ Wed Sep 28 04:13:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:13:04 2022 ] Eval epoch: 32
[ Wed Sep 28 04:13:33 2022 ] 	Mean test loss of 258 batches: 0.6822236321808756.
[ Wed Sep 28 04:13:33 2022 ] 	Top1: 79.68%
[ Wed Sep 28 04:13:33 2022 ] 	Top5: 96.15%
[ Wed Sep 28 04:13:33 2022 ] Training epoch: 33
[ Wed Sep 28 04:16:40 2022 ] 	Mean training loss: 0.5710. loss2: 0.0000. Mean training acc: 82.07%.
[ Wed Sep 28 04:16:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:16:40 2022 ] Eval epoch: 33
[ Wed Sep 28 04:17:10 2022 ] 	Mean test loss of 258 batches: 0.882549482838128.
[ Wed Sep 28 04:17:10 2022 ] 	Top1: 74.25%
[ Wed Sep 28 04:17:10 2022 ] 	Top5: 94.57%
[ Wed Sep 28 04:17:10 2022 ] Training epoch: 34
[ Wed Sep 28 04:20:18 2022 ] 	Mean training loss: 0.5632. loss2: 0.0000. Mean training acc: 82.06%.
[ Wed Sep 28 04:20:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:20:18 2022 ] Eval epoch: 34
[ Wed Sep 28 04:20:47 2022 ] 	Mean test loss of 258 batches: 0.6654378868812738.
[ Wed Sep 28 04:20:47 2022 ] 	Top1: 79.56%
[ Wed Sep 28 04:20:47 2022 ] 	Top5: 96.71%
[ Wed Sep 28 04:20:47 2022 ] Training epoch: 35
[ Wed Sep 28 04:23:54 2022 ] 	Mean training loss: 0.5573. loss2: 0.0000. Mean training acc: 82.40%.
[ Wed Sep 28 04:23:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:23:54 2022 ] Eval epoch: 35
[ Wed Sep 28 04:24:24 2022 ] 	Mean test loss of 258 batches: 0.7855600592817447.
[ Wed Sep 28 04:24:24 2022 ] 	Top1: 76.24%
[ Wed Sep 28 04:24:24 2022 ] 	Top5: 95.48%
[ Wed Sep 28 04:24:24 2022 ] Training epoch: 36
[ Wed Sep 28 04:27:31 2022 ] 	Mean training loss: 0.5579. loss2: 0.0000. Mean training acc: 82.43%.
[ Wed Sep 28 04:27:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:27:31 2022 ] Eval epoch: 36
[ Wed Sep 28 04:28:00 2022 ] 	Mean test loss of 258 batches: 0.7698285987441854.
[ Wed Sep 28 04:28:01 2022 ] 	Top1: 77.15%
[ Wed Sep 28 04:28:01 2022 ] 	Top5: 95.34%
[ Wed Sep 28 04:28:01 2022 ] Training epoch: 37
[ Wed Sep 28 04:31:08 2022 ] 	Mean training loss: 0.5520. loss2: 0.0000. Mean training acc: 82.61%.
[ Wed Sep 28 04:31:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:31:08 2022 ] Eval epoch: 37
[ Wed Sep 28 04:31:38 2022 ] 	Mean test loss of 258 batches: 0.8742230585379194.
[ Wed Sep 28 04:31:38 2022 ] 	Top1: 74.79%
[ Wed Sep 28 04:31:38 2022 ] 	Top5: 94.61%
[ Wed Sep 28 04:31:38 2022 ] Training epoch: 38
[ Wed Sep 28 04:34:45 2022 ] 	Mean training loss: 0.5545. loss2: 0.0000. Mean training acc: 82.55%.
[ Wed Sep 28 04:34:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:34:46 2022 ] Eval epoch: 38
[ Wed Sep 28 04:35:15 2022 ] 	Mean test loss of 258 batches: 0.7593881263173827.
[ Wed Sep 28 04:35:15 2022 ] 	Top1: 77.30%
[ Wed Sep 28 04:35:15 2022 ] 	Top5: 95.85%
[ Wed Sep 28 04:35:15 2022 ] Training epoch: 39
[ Wed Sep 28 04:38:22 2022 ] 	Mean training loss: 0.5463. loss2: 0.0000. Mean training acc: 82.97%.
[ Wed Sep 28 04:38:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:38:23 2022 ] Eval epoch: 39
[ Wed Sep 28 04:38:52 2022 ] 	Mean test loss of 258 batches: 0.7942219800496286.
[ Wed Sep 28 04:38:52 2022 ] 	Top1: 76.61%
[ Wed Sep 28 04:38:52 2022 ] 	Top5: 96.09%
[ Wed Sep 28 04:38:52 2022 ] Training epoch: 40
[ Wed Sep 28 04:42:00 2022 ] 	Mean training loss: 0.5539. loss2: 0.0000. Mean training acc: 82.49%.
[ Wed Sep 28 04:42:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:42:00 2022 ] Eval epoch: 40
[ Wed Sep 28 04:42:29 2022 ] 	Mean test loss of 258 batches: 0.8108983974355136.
[ Wed Sep 28 04:42:29 2022 ] 	Top1: 76.30%
[ Wed Sep 28 04:42:29 2022 ] 	Top5: 95.37%
[ Wed Sep 28 04:42:29 2022 ] Training epoch: 41
[ Wed Sep 28 04:45:38 2022 ] 	Mean training loss: 0.5513. loss2: 0.0000. Mean training acc: 82.48%.
[ Wed Sep 28 04:45:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:45:38 2022 ] Eval epoch: 41
[ Wed Sep 28 04:46:07 2022 ] 	Mean test loss of 258 batches: 0.8762773546253064.
[ Wed Sep 28 04:46:07 2022 ] 	Top1: 74.71%
[ Wed Sep 28 04:46:07 2022 ] 	Top5: 94.63%
[ Wed Sep 28 04:46:07 2022 ] Training epoch: 42
[ Wed Sep 28 04:49:14 2022 ] 	Mean training loss: 0.5478. loss2: 0.0000. Mean training acc: 82.61%.
[ Wed Sep 28 04:49:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:49:14 2022 ] Eval epoch: 42
[ Wed Sep 28 04:49:44 2022 ] 	Mean test loss of 258 batches: 0.6644383596126423.
[ Wed Sep 28 04:49:44 2022 ] 	Top1: 80.21%
[ Wed Sep 28 04:49:44 2022 ] 	Top5: 96.42%
[ Wed Sep 28 04:49:44 2022 ] Training epoch: 43
[ Wed Sep 28 04:52:51 2022 ] 	Mean training loss: 0.5475. loss2: 0.0000. Mean training acc: 82.67%.
[ Wed Sep 28 04:52:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:52:51 2022 ] Eval epoch: 43
[ Wed Sep 28 04:53:21 2022 ] 	Mean test loss of 258 batches: 0.7966352991124456.
[ Wed Sep 28 04:53:21 2022 ] 	Top1: 75.56%
[ Wed Sep 28 04:53:21 2022 ] 	Top5: 95.72%
[ Wed Sep 28 04:53:21 2022 ] Training epoch: 44
[ Wed Sep 28 04:56:29 2022 ] 	Mean training loss: 0.5492. loss2: 0.0000. Mean training acc: 82.73%.
[ Wed Sep 28 04:56:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:56:30 2022 ] Eval epoch: 44
[ Wed Sep 28 04:56:59 2022 ] 	Mean test loss of 258 batches: 0.9743735023247179.
[ Wed Sep 28 04:56:59 2022 ] 	Top1: 72.54%
[ Wed Sep 28 04:56:59 2022 ] 	Top5: 94.03%
[ Wed Sep 28 04:56:59 2022 ] Training epoch: 45
[ Wed Sep 28 05:00:07 2022 ] 	Mean training loss: 0.5438. loss2: 0.0000. Mean training acc: 82.74%.
[ Wed Sep 28 05:00:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:00:07 2022 ] Eval epoch: 45
[ Wed Sep 28 05:00:36 2022 ] 	Mean test loss of 258 batches: 0.8306974876065587.
[ Wed Sep 28 05:00:36 2022 ] 	Top1: 75.93%
[ Wed Sep 28 05:00:36 2022 ] 	Top5: 95.28%
[ Wed Sep 28 05:00:36 2022 ] Training epoch: 46
[ Wed Sep 28 05:03:44 2022 ] 	Mean training loss: 0.5406. loss2: 0.0000. Mean training acc: 82.88%.
[ Wed Sep 28 05:03:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:03:44 2022 ] Eval epoch: 46
[ Wed Sep 28 05:04:13 2022 ] 	Mean test loss of 258 batches: 0.8138134217308474.
[ Wed Sep 28 05:04:13 2022 ] 	Top1: 75.84%
[ Wed Sep 28 05:04:13 2022 ] 	Top5: 95.21%
[ Wed Sep 28 05:04:13 2022 ] Training epoch: 47
[ Wed Sep 28 05:07:21 2022 ] 	Mean training loss: 0.5421. loss2: 0.0000. Mean training acc: 82.92%.
[ Wed Sep 28 05:07:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:07:21 2022 ] Eval epoch: 47
[ Wed Sep 28 05:07:50 2022 ] 	Mean test loss of 258 batches: 0.9402220967666123.
[ Wed Sep 28 05:07:50 2022 ] 	Top1: 72.78%
[ Wed Sep 28 05:07:50 2022 ] 	Top5: 94.05%
[ Wed Sep 28 05:07:50 2022 ] Training epoch: 48
[ Wed Sep 28 05:10:58 2022 ] 	Mean training loss: 0.5441. loss2: 0.0000. Mean training acc: 82.85%.
[ Wed Sep 28 05:10:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:10:58 2022 ] Eval epoch: 48
[ Wed Sep 28 05:11:27 2022 ] 	Mean test loss of 258 batches: 0.8878881501597028.
[ Wed Sep 28 05:11:27 2022 ] 	Top1: 75.14%
[ Wed Sep 28 05:11:27 2022 ] 	Top5: 94.86%
[ Wed Sep 28 05:11:27 2022 ] Training epoch: 49
[ Wed Sep 28 05:14:35 2022 ] 	Mean training loss: 0.5408. loss2: 0.0000. Mean training acc: 82.83%.
[ Wed Sep 28 05:14:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:14:35 2022 ] Eval epoch: 49
[ Wed Sep 28 05:15:04 2022 ] 	Mean test loss of 258 batches: 0.8467806165879087.
[ Wed Sep 28 05:15:04 2022 ] 	Top1: 75.22%
[ Wed Sep 28 05:15:04 2022 ] 	Top5: 95.66%
[ Wed Sep 28 05:15:04 2022 ] Training epoch: 50
[ Wed Sep 28 05:18:12 2022 ] 	Mean training loss: 0.5349. loss2: 0.0000. Mean training acc: 83.01%.
[ Wed Sep 28 05:18:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:18:12 2022 ] Eval epoch: 50
[ Wed Sep 28 05:18:41 2022 ] 	Mean test loss of 258 batches: 0.6665565867063611.
[ Wed Sep 28 05:18:42 2022 ] 	Top1: 79.93%
[ Wed Sep 28 05:18:42 2022 ] 	Top5: 96.58%
[ Wed Sep 28 05:18:42 2022 ] Training epoch: 51
[ Wed Sep 28 05:21:49 2022 ] 	Mean training loss: 0.5411. loss2: 0.0000. Mean training acc: 82.88%.
[ Wed Sep 28 05:21:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:21:49 2022 ] Eval epoch: 51
[ Wed Sep 28 05:22:18 2022 ] 	Mean test loss of 258 batches: 0.8298563154399857.
[ Wed Sep 28 05:22:19 2022 ] 	Top1: 75.98%
[ Wed Sep 28 05:22:19 2022 ] 	Top5: 95.04%
[ Wed Sep 28 05:22:19 2022 ] Training epoch: 52
[ Wed Sep 28 05:25:26 2022 ] 	Mean training loss: 0.5372. loss2: 0.0000. Mean training acc: 82.93%.
[ Wed Sep 28 05:25:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:25:26 2022 ] Eval epoch: 52
[ Wed Sep 28 05:25:56 2022 ] 	Mean test loss of 258 batches: 0.7140483437466991.
[ Wed Sep 28 05:25:56 2022 ] 	Top1: 79.43%
[ Wed Sep 28 05:25:56 2022 ] 	Top5: 96.27%
[ Wed Sep 28 05:25:56 2022 ] Training epoch: 53
[ Wed Sep 28 05:29:03 2022 ] 	Mean training loss: 0.5307. loss2: 0.0000. Mean training acc: 83.36%.
[ Wed Sep 28 05:29:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:29:03 2022 ] Eval epoch: 53
[ Wed Sep 28 05:29:32 2022 ] 	Mean test loss of 258 batches: 0.6954675618876782.
[ Wed Sep 28 05:29:33 2022 ] 	Top1: 78.80%
[ Wed Sep 28 05:29:33 2022 ] 	Top5: 96.35%
[ Wed Sep 28 05:29:33 2022 ] Training epoch: 54
[ Wed Sep 28 05:32:41 2022 ] 	Mean training loss: 0.5348. loss2: 0.0000. Mean training acc: 83.25%.
[ Wed Sep 28 05:32:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:32:41 2022 ] Eval epoch: 54
[ Wed Sep 28 05:33:10 2022 ] 	Mean test loss of 258 batches: 0.7251851483601932.
[ Wed Sep 28 05:33:10 2022 ] 	Top1: 77.74%
[ Wed Sep 28 05:33:10 2022 ] 	Top5: 96.32%
[ Wed Sep 28 05:33:10 2022 ] Training epoch: 55
[ Wed Sep 28 05:36:17 2022 ] 	Mean training loss: 0.5433. loss2: 0.0000. Mean training acc: 82.92%.
[ Wed Sep 28 05:36:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:36:17 2022 ] Eval epoch: 55
[ Wed Sep 28 05:36:47 2022 ] 	Mean test loss of 258 batches: 0.9494028131860172.
[ Wed Sep 28 05:36:47 2022 ] 	Top1: 72.73%
[ Wed Sep 28 05:36:47 2022 ] 	Top5: 94.16%
[ Wed Sep 28 05:36:47 2022 ] Training epoch: 56
[ Wed Sep 28 05:39:54 2022 ] 	Mean training loss: 0.5317. loss2: 0.0000. Mean training acc: 83.18%.
[ Wed Sep 28 05:39:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:39:55 2022 ] Eval epoch: 56
[ Wed Sep 28 05:40:24 2022 ] 	Mean test loss of 258 batches: 0.7099673115005789.
[ Wed Sep 28 05:40:24 2022 ] 	Top1: 79.05%
[ Wed Sep 28 05:40:24 2022 ] 	Top5: 95.91%
[ Wed Sep 28 05:40:24 2022 ] Training epoch: 57
[ Wed Sep 28 05:43:31 2022 ] 	Mean training loss: 0.5324. loss2: 0.0000. Mean training acc: 83.36%.
[ Wed Sep 28 05:43:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:43:31 2022 ] Eval epoch: 57
[ Wed Sep 28 05:44:01 2022 ] 	Mean test loss of 258 batches: 0.7226417258728383.
[ Wed Sep 28 05:44:01 2022 ] 	Top1: 78.28%
[ Wed Sep 28 05:44:01 2022 ] 	Top5: 96.03%
[ Wed Sep 28 05:44:01 2022 ] Training epoch: 58
[ Wed Sep 28 05:47:08 2022 ] 	Mean training loss: 0.5309. loss2: 0.0000. Mean training acc: 83.14%.
[ Wed Sep 28 05:47:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:47:08 2022 ] Eval epoch: 58
[ Wed Sep 28 05:47:38 2022 ] 	Mean test loss of 258 batches: 0.8025355769913326.
[ Wed Sep 28 05:47:38 2022 ] 	Top1: 76.28%
[ Wed Sep 28 05:47:38 2022 ] 	Top5: 96.17%
[ Wed Sep 28 05:47:38 2022 ] Training epoch: 59
[ Wed Sep 28 05:50:46 2022 ] 	Mean training loss: 0.5331. loss2: 0.0000. Mean training acc: 83.23%.
[ Wed Sep 28 05:50:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:50:46 2022 ] Eval epoch: 59
[ Wed Sep 28 05:51:15 2022 ] 	Mean test loss of 258 batches: 0.7176414509845335.
[ Wed Sep 28 05:51:15 2022 ] 	Top1: 78.84%
[ Wed Sep 28 05:51:15 2022 ] 	Top5: 96.54%
[ Wed Sep 28 05:51:15 2022 ] Training epoch: 60
[ Wed Sep 28 05:54:23 2022 ] 	Mean training loss: 0.5375. loss2: 0.0000. Mean training acc: 82.98%.
[ Wed Sep 28 05:54:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:54:23 2022 ] Eval epoch: 60
[ Wed Sep 28 05:54:52 2022 ] 	Mean test loss of 258 batches: 0.9602099072563556.
[ Wed Sep 28 05:54:52 2022 ] 	Top1: 72.83%
[ Wed Sep 28 05:54:52 2022 ] 	Top5: 94.03%
[ Wed Sep 28 05:54:52 2022 ] Training epoch: 61
[ Wed Sep 28 05:58:00 2022 ] 	Mean training loss: 0.5189. loss2: 0.0000. Mean training acc: 83.58%.
[ Wed Sep 28 05:58:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:58:00 2022 ] Eval epoch: 61
[ Wed Sep 28 05:58:29 2022 ] 	Mean test loss of 258 batches: 0.763529563019442.
[ Wed Sep 28 05:58:29 2022 ] 	Top1: 77.49%
[ Wed Sep 28 05:58:29 2022 ] 	Top5: 95.92%
[ Wed Sep 28 05:58:29 2022 ] Training epoch: 62
[ Wed Sep 28 06:01:37 2022 ] 	Mean training loss: 0.5321. loss2: 0.0000. Mean training acc: 83.05%.
[ Wed Sep 28 06:01:37 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:01:37 2022 ] Eval epoch: 62
[ Wed Sep 28 06:02:06 2022 ] 	Mean test loss of 258 batches: 0.9647802852613981.
[ Wed Sep 28 06:02:06 2022 ] 	Top1: 71.47%
[ Wed Sep 28 06:02:06 2022 ] 	Top5: 93.73%
[ Wed Sep 28 06:02:06 2022 ] Training epoch: 63
[ Wed Sep 28 06:05:14 2022 ] 	Mean training loss: 0.5321. loss2: 0.0000. Mean training acc: 83.27%.
[ Wed Sep 28 06:05:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:05:14 2022 ] Eval epoch: 63
[ Wed Sep 28 06:05:43 2022 ] 	Mean test loss of 258 batches: 0.7039754481394161.
[ Wed Sep 28 06:05:43 2022 ] 	Top1: 79.11%
[ Wed Sep 28 06:05:43 2022 ] 	Top5: 96.27%
[ Wed Sep 28 06:05:43 2022 ] Training epoch: 64
[ Wed Sep 28 06:08:51 2022 ] 	Mean training loss: 0.5281. loss2: 0.0000. Mean training acc: 83.24%.
[ Wed Sep 28 06:08:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:08:51 2022 ] Eval epoch: 64
[ Wed Sep 28 06:09:20 2022 ] 	Mean test loss of 258 batches: 0.7662890797206597.
[ Wed Sep 28 06:09:20 2022 ] 	Top1: 76.41%
[ Wed Sep 28 06:09:20 2022 ] 	Top5: 96.11%
[ Wed Sep 28 06:09:20 2022 ] Training epoch: 65
[ Wed Sep 28 06:12:28 2022 ] 	Mean training loss: 0.5198. loss2: 0.0000. Mean training acc: 83.52%.
[ Wed Sep 28 06:12:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:12:28 2022 ] Eval epoch: 65
[ Wed Sep 28 06:12:57 2022 ] 	Mean test loss of 258 batches: 0.7440216396321622.
[ Wed Sep 28 06:12:57 2022 ] 	Top1: 77.49%
[ Wed Sep 28 06:12:57 2022 ] 	Top5: 95.88%
[ Wed Sep 28 06:12:57 2022 ] Training epoch: 66
[ Wed Sep 28 06:16:05 2022 ] 	Mean training loss: 0.5306. loss2: 0.0000. Mean training acc: 83.37%.
[ Wed Sep 28 06:16:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:16:05 2022 ] Eval epoch: 66
[ Wed Sep 28 06:16:34 2022 ] 	Mean test loss of 258 batches: 0.6669138468397681.
[ Wed Sep 28 06:16:34 2022 ] 	Top1: 80.23%
[ Wed Sep 28 06:16:34 2022 ] 	Top5: 96.33%
[ Wed Sep 28 06:16:34 2022 ] Training epoch: 67
[ Wed Sep 28 06:19:42 2022 ] 	Mean training loss: 0.5246. loss2: 0.0000. Mean training acc: 83.51%.
[ Wed Sep 28 06:19:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:19:42 2022 ] Eval epoch: 67
[ Wed Sep 28 06:20:11 2022 ] 	Mean test loss of 258 batches: 0.7857418160798938.
[ Wed Sep 28 06:20:11 2022 ] 	Top1: 77.29%
[ Wed Sep 28 06:20:11 2022 ] 	Top5: 95.00%
[ Wed Sep 28 06:20:11 2022 ] Training epoch: 68
[ Wed Sep 28 06:23:19 2022 ] 	Mean training loss: 0.5319. loss2: 0.0000. Mean training acc: 83.21%.
[ Wed Sep 28 06:23:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:23:19 2022 ] Eval epoch: 68
[ Wed Sep 28 06:23:48 2022 ] 	Mean test loss of 258 batches: 0.7579514283773511.
[ Wed Sep 28 06:23:48 2022 ] 	Top1: 77.13%
[ Wed Sep 28 06:23:48 2022 ] 	Top5: 96.12%
[ Wed Sep 28 06:23:48 2022 ] Training epoch: 69
[ Wed Sep 28 06:26:56 2022 ] 	Mean training loss: 0.5235. loss2: 0.0000. Mean training acc: 83.37%.
[ Wed Sep 28 06:26:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:26:56 2022 ] Eval epoch: 69
[ Wed Sep 28 06:27:25 2022 ] 	Mean test loss of 258 batches: 0.7441913276448731.
[ Wed Sep 28 06:27:25 2022 ] 	Top1: 78.52%
[ Wed Sep 28 06:27:25 2022 ] 	Top5: 95.38%
[ Wed Sep 28 06:27:25 2022 ] Training epoch: 70
[ Wed Sep 28 06:30:32 2022 ] 	Mean training loss: 0.5231. loss2: 0.0000. Mean training acc: 83.55%.
[ Wed Sep 28 06:30:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:30:32 2022 ] Eval epoch: 70
[ Wed Sep 28 06:31:01 2022 ] 	Mean test loss of 258 batches: 0.7532272180036981.
[ Wed Sep 28 06:31:02 2022 ] 	Top1: 78.90%
[ Wed Sep 28 06:31:02 2022 ] 	Top5: 95.45%
[ Wed Sep 28 06:31:02 2022 ] Training epoch: 71
[ Wed Sep 28 06:34:09 2022 ] 	Mean training loss: 0.5316. loss2: 0.0000. Mean training acc: 83.09%.
[ Wed Sep 28 06:34:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:34:09 2022 ] Eval epoch: 71
[ Wed Sep 28 06:34:38 2022 ] 	Mean test loss of 258 batches: 0.668596927451995.
[ Wed Sep 28 06:34:39 2022 ] 	Top1: 80.10%
[ Wed Sep 28 06:34:39 2022 ] 	Top5: 96.18%
[ Wed Sep 28 06:34:39 2022 ] Training epoch: 72
[ Wed Sep 28 06:37:46 2022 ] 	Mean training loss: 0.5198. loss2: 0.0000. Mean training acc: 83.54%.
[ Wed Sep 28 06:37:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:37:46 2022 ] Eval epoch: 72
[ Wed Sep 28 06:38:16 2022 ] 	Mean test loss of 258 batches: 0.6822033578804297.
[ Wed Sep 28 06:38:16 2022 ] 	Top1: 79.40%
[ Wed Sep 28 06:38:16 2022 ] 	Top5: 96.40%
[ Wed Sep 28 06:38:16 2022 ] Training epoch: 73
[ Wed Sep 28 06:41:24 2022 ] 	Mean training loss: 0.5266. loss2: 0.0000. Mean training acc: 83.33%.
[ Wed Sep 28 06:41:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:41:24 2022 ] Eval epoch: 73
[ Wed Sep 28 06:41:53 2022 ] 	Mean test loss of 258 batches: 0.705725052733292.
[ Wed Sep 28 06:41:54 2022 ] 	Top1: 78.69%
[ Wed Sep 28 06:41:54 2022 ] 	Top5: 96.05%
[ Wed Sep 28 06:41:54 2022 ] Training epoch: 74
[ Wed Sep 28 06:45:01 2022 ] 	Mean training loss: 0.5216. loss2: 0.0000. Mean training acc: 83.51%.
[ Wed Sep 28 06:45:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:45:01 2022 ] Eval epoch: 74
[ Wed Sep 28 06:45:30 2022 ] 	Mean test loss of 258 batches: 0.7305469477130461.
[ Wed Sep 28 06:45:30 2022 ] 	Top1: 77.71%
[ Wed Sep 28 06:45:30 2022 ] 	Top5: 96.08%
[ Wed Sep 28 06:45:30 2022 ] Training epoch: 75
[ Wed Sep 28 06:48:38 2022 ] 	Mean training loss: 0.5199. loss2: 0.0000. Mean training acc: 83.57%.
[ Wed Sep 28 06:48:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:48:38 2022 ] Eval epoch: 75
[ Wed Sep 28 06:49:07 2022 ] 	Mean test loss of 258 batches: 0.7842775578646697.
[ Wed Sep 28 06:49:07 2022 ] 	Top1: 77.09%
[ Wed Sep 28 06:49:07 2022 ] 	Top5: 95.89%
[ Wed Sep 28 06:49:07 2022 ] Training epoch: 76
[ Wed Sep 28 06:52:15 2022 ] 	Mean training loss: 0.5221. loss2: 0.0000. Mean training acc: 83.59%.
[ Wed Sep 28 06:52:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:52:15 2022 ] Eval epoch: 76
[ Wed Sep 28 06:52:44 2022 ] 	Mean test loss of 258 batches: 0.7446857012750566.
[ Wed Sep 28 06:52:44 2022 ] 	Top1: 77.53%
[ Wed Sep 28 06:52:44 2022 ] 	Top5: 95.52%
[ Wed Sep 28 06:52:44 2022 ] Training epoch: 77
[ Wed Sep 28 06:55:52 2022 ] 	Mean training loss: 0.5287. loss2: 0.0000. Mean training acc: 83.33%.
[ Wed Sep 28 06:55:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:55:52 2022 ] Eval epoch: 77
[ Wed Sep 28 06:56:21 2022 ] 	Mean test loss of 258 batches: 0.7768782060160193.
[ Wed Sep 28 06:56:21 2022 ] 	Top1: 76.69%
[ Wed Sep 28 06:56:21 2022 ] 	Top5: 95.76%
[ Wed Sep 28 06:56:21 2022 ] Training epoch: 78
[ Wed Sep 28 06:59:28 2022 ] 	Mean training loss: 0.5267. loss2: 0.0000. Mean training acc: 83.42%.
[ Wed Sep 28 06:59:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:59:29 2022 ] Eval epoch: 78
[ Wed Sep 28 06:59:58 2022 ] 	Mean test loss of 258 batches: 0.6846878306232682.
[ Wed Sep 28 06:59:58 2022 ] 	Top1: 79.20%
[ Wed Sep 28 06:59:58 2022 ] 	Top5: 96.66%
[ Wed Sep 28 06:59:58 2022 ] Training epoch: 79
[ Wed Sep 28 07:03:05 2022 ] 	Mean training loss: 0.5177. loss2: 0.0000. Mean training acc: 83.57%.
[ Wed Sep 28 07:03:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:03:05 2022 ] Eval epoch: 79
[ Wed Sep 28 07:03:35 2022 ] 	Mean test loss of 258 batches: 0.6511654853820801.
[ Wed Sep 28 07:03:35 2022 ] 	Top1: 80.30%
[ Wed Sep 28 07:03:35 2022 ] 	Top5: 96.37%
[ Wed Sep 28 07:03:35 2022 ] Training epoch: 80
[ Wed Sep 28 07:06:42 2022 ] 	Mean training loss: 0.5160. loss2: 0.0000. Mean training acc: 83.59%.
[ Wed Sep 28 07:06:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:06:42 2022 ] Eval epoch: 80
[ Wed Sep 28 07:07:12 2022 ] 	Mean test loss of 258 batches: 0.820560643492743.
[ Wed Sep 28 07:07:12 2022 ] 	Top1: 75.81%
[ Wed Sep 28 07:07:12 2022 ] 	Top5: 94.83%
[ Wed Sep 28 07:07:12 2022 ] Training epoch: 81
[ Wed Sep 28 07:10:19 2022 ] 	Mean training loss: 0.5200. loss2: 0.0000. Mean training acc: 83.60%.
[ Wed Sep 28 07:10:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:10:19 2022 ] Eval epoch: 81
[ Wed Sep 28 07:10:49 2022 ] 	Mean test loss of 258 batches: 0.6468400950002116.
[ Wed Sep 28 07:10:49 2022 ] 	Top1: 80.12%
[ Wed Sep 28 07:10:49 2022 ] 	Top5: 96.57%
[ Wed Sep 28 07:10:49 2022 ] Training epoch: 82
[ Wed Sep 28 07:13:57 2022 ] 	Mean training loss: 0.5244. loss2: 0.0000. Mean training acc: 83.49%.
[ Wed Sep 28 07:13:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:13:57 2022 ] Eval epoch: 82
[ Wed Sep 28 07:14:26 2022 ] 	Mean test loss of 258 batches: 0.6834122234190158.
[ Wed Sep 28 07:14:26 2022 ] 	Top1: 79.74%
[ Wed Sep 28 07:14:26 2022 ] 	Top5: 96.23%
[ Wed Sep 28 07:14:26 2022 ] Training epoch: 83
[ Wed Sep 28 07:17:34 2022 ] 	Mean training loss: 0.5232. loss2: 0.0000. Mean training acc: 83.40%.
[ Wed Sep 28 07:17:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:17:34 2022 ] Eval epoch: 83
[ Wed Sep 28 07:18:03 2022 ] 	Mean test loss of 258 batches: 0.6349329099405644.
[ Wed Sep 28 07:18:03 2022 ] 	Top1: 80.94%
[ Wed Sep 28 07:18:03 2022 ] 	Top5: 96.53%
[ Wed Sep 28 07:18:03 2022 ] Training epoch: 84
[ Wed Sep 28 07:21:11 2022 ] 	Mean training loss: 0.5244. loss2: 0.0000. Mean training acc: 83.52%.
[ Wed Sep 28 07:21:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:21:11 2022 ] Eval epoch: 84
[ Wed Sep 28 07:21:40 2022 ] 	Mean test loss of 258 batches: 0.7256442025419354.
[ Wed Sep 28 07:21:40 2022 ] 	Top1: 78.20%
[ Wed Sep 28 07:21:40 2022 ] 	Top5: 95.96%
[ Wed Sep 28 07:21:40 2022 ] Training epoch: 85
[ Wed Sep 28 07:24:48 2022 ] 	Mean training loss: 0.5227. loss2: 0.0000. Mean training acc: 83.62%.
[ Wed Sep 28 07:24:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:24:48 2022 ] Eval epoch: 85
[ Wed Sep 28 07:25:17 2022 ] 	Mean test loss of 258 batches: 1.022892995282661.
[ Wed Sep 28 07:25:17 2022 ] 	Top1: 71.35%
[ Wed Sep 28 07:25:17 2022 ] 	Top5: 94.07%
[ Wed Sep 28 07:25:17 2022 ] Training epoch: 86
[ Wed Sep 28 07:28:25 2022 ] 	Mean training loss: 0.5148. loss2: 0.0000. Mean training acc: 83.71%.
[ Wed Sep 28 07:28:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:28:25 2022 ] Eval epoch: 86
[ Wed Sep 28 07:28:54 2022 ] 	Mean test loss of 258 batches: 0.7348022188327109.
[ Wed Sep 28 07:28:55 2022 ] 	Top1: 77.81%
[ Wed Sep 28 07:28:55 2022 ] 	Top5: 95.35%
[ Wed Sep 28 07:28:55 2022 ] Training epoch: 87
[ Wed Sep 28 07:32:02 2022 ] 	Mean training loss: 0.5181. loss2: 0.0000. Mean training acc: 83.55%.
[ Wed Sep 28 07:32:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:32:02 2022 ] Eval epoch: 87
[ Wed Sep 28 07:32:32 2022 ] 	Mean test loss of 258 batches: 0.7032597300040629.
[ Wed Sep 28 07:32:32 2022 ] 	Top1: 78.90%
[ Wed Sep 28 07:32:32 2022 ] 	Top5: 95.72%
[ Wed Sep 28 07:32:32 2022 ] Training epoch: 88
[ Wed Sep 28 07:35:39 2022 ] 	Mean training loss: 0.5210. loss2: 0.0000. Mean training acc: 83.37%.
[ Wed Sep 28 07:35:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:35:40 2022 ] Eval epoch: 88
[ Wed Sep 28 07:36:09 2022 ] 	Mean test loss of 258 batches: 0.8855203086322592.
[ Wed Sep 28 07:36:09 2022 ] 	Top1: 75.46%
[ Wed Sep 28 07:36:09 2022 ] 	Top5: 94.36%
[ Wed Sep 28 07:36:09 2022 ] Training epoch: 89
[ Wed Sep 28 07:39:17 2022 ] 	Mean training loss: 0.5201. loss2: 0.0000. Mean training acc: 83.56%.
[ Wed Sep 28 07:39:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:39:17 2022 ] Eval epoch: 89
[ Wed Sep 28 07:39:46 2022 ] 	Mean test loss of 258 batches: 0.7509134473033654.
[ Wed Sep 28 07:39:46 2022 ] 	Top1: 77.37%
[ Wed Sep 28 07:39:46 2022 ] 	Top5: 95.68%
[ Wed Sep 28 07:39:46 2022 ] Training epoch: 90
[ Wed Sep 28 07:42:54 2022 ] 	Mean training loss: 0.5201. loss2: 0.0000. Mean training acc: 83.64%.
[ Wed Sep 28 07:42:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:42:54 2022 ] Eval epoch: 90
[ Wed Sep 28 07:43:23 2022 ] 	Mean test loss of 258 batches: 0.8204321294098862.
[ Wed Sep 28 07:43:23 2022 ] 	Top1: 75.20%
[ Wed Sep 28 07:43:23 2022 ] 	Top5: 96.05%
[ Wed Sep 28 07:43:23 2022 ] Training epoch: 91
[ Wed Sep 28 07:46:31 2022 ] 	Mean training loss: 0.3121. loss2: 0.0000. Mean training acc: 90.37%.
[ Wed Sep 28 07:46:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:46:31 2022 ] Eval epoch: 91
[ Wed Sep 28 07:47:00 2022 ] 	Mean test loss of 258 batches: 0.42572794744903725.
[ Wed Sep 28 07:47:00 2022 ] 	Top1: 87.04%
[ Wed Sep 28 07:47:00 2022 ] 	Top5: 97.93%
[ Wed Sep 28 07:47:00 2022 ] Training epoch: 92
[ Wed Sep 28 07:50:08 2022 ] 	Mean training loss: 0.2493. loss2: 0.0000. Mean training acc: 92.44%.
[ Wed Sep 28 07:50:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:50:08 2022 ] Eval epoch: 92
[ Wed Sep 28 07:50:37 2022 ] 	Mean test loss of 258 batches: 0.4166258246341879.
[ Wed Sep 28 07:50:37 2022 ] 	Top1: 87.42%
[ Wed Sep 28 07:50:37 2022 ] 	Top5: 97.95%
[ Wed Sep 28 07:50:37 2022 ] Training epoch: 93
[ Wed Sep 28 07:53:45 2022 ] 	Mean training loss: 0.2255. loss2: 0.0000. Mean training acc: 93.09%.
[ Wed Sep 28 07:53:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:53:45 2022 ] Eval epoch: 93
[ Wed Sep 28 07:54:14 2022 ] 	Mean test loss of 258 batches: 0.41594328950772913.
[ Wed Sep 28 07:54:14 2022 ] 	Top1: 87.46%
[ Wed Sep 28 07:54:14 2022 ] 	Top5: 98.08%
[ Wed Sep 28 07:54:14 2022 ] Training epoch: 94
[ Wed Sep 28 07:57:22 2022 ] 	Mean training loss: 0.2009. loss2: 0.0000. Mean training acc: 93.99%.
[ Wed Sep 28 07:57:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:57:22 2022 ] Eval epoch: 94
[ Wed Sep 28 07:57:51 2022 ] 	Mean test loss of 258 batches: 0.4067412822168003.
[ Wed Sep 28 07:57:51 2022 ] 	Top1: 88.03%
[ Wed Sep 28 07:57:51 2022 ] 	Top5: 98.01%
[ Wed Sep 28 07:57:51 2022 ] Training epoch: 95
[ Wed Sep 28 08:00:58 2022 ] 	Mean training loss: 0.1904. loss2: 0.0000. Mean training acc: 94.21%.
[ Wed Sep 28 08:00:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:00:59 2022 ] Eval epoch: 95
[ Wed Sep 28 08:01:28 2022 ] 	Mean test loss of 258 batches: 0.41616527666879255.
[ Wed Sep 28 08:01:28 2022 ] 	Top1: 87.64%
[ Wed Sep 28 08:01:28 2022 ] 	Top5: 97.95%
[ Wed Sep 28 08:01:28 2022 ] Training epoch: 96
[ Wed Sep 28 08:04:36 2022 ] 	Mean training loss: 0.1800. loss2: 0.0000. Mean training acc: 94.53%.
[ Wed Sep 28 08:04:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:04:36 2022 ] Eval epoch: 96
[ Wed Sep 28 08:05:05 2022 ] 	Mean test loss of 258 batches: 0.42478897832622825.
[ Wed Sep 28 08:05:05 2022 ] 	Top1: 87.40%
[ Wed Sep 28 08:05:05 2022 ] 	Top5: 97.90%
[ Wed Sep 28 08:05:05 2022 ] Training epoch: 97
[ Wed Sep 28 08:08:13 2022 ] 	Mean training loss: 0.1692. loss2: 0.0000. Mean training acc: 94.99%.
[ Wed Sep 28 08:08:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:08:13 2022 ] Eval epoch: 97
[ Wed Sep 28 08:08:42 2022 ] 	Mean test loss of 258 batches: 0.41587594496591607.
[ Wed Sep 28 08:08:42 2022 ] 	Top1: 87.77%
[ Wed Sep 28 08:08:42 2022 ] 	Top5: 97.93%
[ Wed Sep 28 08:08:42 2022 ] Training epoch: 98
[ Wed Sep 28 08:11:49 2022 ] 	Mean training loss: 0.1642. loss2: 0.0000. Mean training acc: 95.04%.
[ Wed Sep 28 08:11:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:11:49 2022 ] Eval epoch: 98
[ Wed Sep 28 08:12:19 2022 ] 	Mean test loss of 258 batches: 0.42848349366943506.
[ Wed Sep 28 08:12:19 2022 ] 	Top1: 87.26%
[ Wed Sep 28 08:12:19 2022 ] 	Top5: 98.00%
[ Wed Sep 28 08:12:19 2022 ] Training epoch: 99
[ Wed Sep 28 08:15:26 2022 ] 	Mean training loss: 0.1504. loss2: 0.0000. Mean training acc: 95.57%.
[ Wed Sep 28 08:15:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:15:26 2022 ] Eval epoch: 99
[ Wed Sep 28 08:15:56 2022 ] 	Mean test loss of 258 batches: 0.4543614677160747.
[ Wed Sep 28 08:15:56 2022 ] 	Top1: 87.14%
[ Wed Sep 28 08:15:56 2022 ] 	Top5: 97.76%
[ Wed Sep 28 08:15:56 2022 ] Training epoch: 100
[ Wed Sep 28 08:19:03 2022 ] 	Mean training loss: 0.1443. loss2: 0.0000. Mean training acc: 95.81%.
[ Wed Sep 28 08:19:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:19:03 2022 ] Eval epoch: 100
[ Wed Sep 28 08:19:33 2022 ] 	Mean test loss of 258 batches: 0.45992705165300257.
[ Wed Sep 28 08:19:33 2022 ] 	Top1: 86.81%
[ Wed Sep 28 08:19:33 2022 ] 	Top5: 97.83%
[ Wed Sep 28 08:19:33 2022 ] Training epoch: 101
[ Wed Sep 28 08:22:41 2022 ] 	Mean training loss: 0.1194. loss2: 0.0000. Mean training acc: 96.65%.
[ Wed Sep 28 08:22:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:22:41 2022 ] Eval epoch: 101
[ Wed Sep 28 08:23:10 2022 ] 	Mean test loss of 258 batches: 0.41833743105043275.
[ Wed Sep 28 08:23:10 2022 ] 	Top1: 87.88%
[ Wed Sep 28 08:23:10 2022 ] 	Top5: 97.90%
[ Wed Sep 28 08:23:10 2022 ] Training epoch: 102
[ Wed Sep 28 08:26:18 2022 ] 	Mean training loss: 0.1110. loss2: 0.0000. Mean training acc: 96.96%.
[ Wed Sep 28 08:26:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:26:18 2022 ] Eval epoch: 102
[ Wed Sep 28 08:26:47 2022 ] 	Mean test loss of 258 batches: 0.41753666453756566.
[ Wed Sep 28 08:26:47 2022 ] 	Top1: 88.06%
[ Wed Sep 28 08:26:47 2022 ] 	Top5: 98.05%
[ Wed Sep 28 08:26:47 2022 ] Training epoch: 103
[ Wed Sep 28 08:29:55 2022 ] 	Mean training loss: 0.1011. loss2: 0.0000. Mean training acc: 97.32%.
[ Wed Sep 28 08:29:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:29:55 2022 ] Eval epoch: 103
[ Wed Sep 28 08:30:24 2022 ] 	Mean test loss of 258 batches: 0.4200501806543086.
[ Wed Sep 28 08:30:24 2022 ] 	Top1: 87.99%
[ Wed Sep 28 08:30:24 2022 ] 	Top5: 97.99%
[ Wed Sep 28 08:30:24 2022 ] Training epoch: 104
[ Wed Sep 28 08:33:32 2022 ] 	Mean training loss: 0.1001. loss2: 0.0000. Mean training acc: 97.34%.
[ Wed Sep 28 08:33:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:33:32 2022 ] Eval epoch: 104
[ Wed Sep 28 08:34:01 2022 ] 	Mean test loss of 258 batches: 0.4173173560912526.
[ Wed Sep 28 08:34:01 2022 ] 	Top1: 88.08%
[ Wed Sep 28 08:34:01 2022 ] 	Top5: 98.02%
[ Wed Sep 28 08:34:01 2022 ] Training epoch: 105
[ Wed Sep 28 08:37:08 2022 ] 	Mean training loss: 0.1006. loss2: 0.0000. Mean training acc: 97.31%.
[ Wed Sep 28 08:37:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:37:08 2022 ] Eval epoch: 105
[ Wed Sep 28 08:37:37 2022 ] 	Mean test loss of 258 batches: 0.41998067101131575.
[ Wed Sep 28 08:37:38 2022 ] 	Top1: 88.07%
[ Wed Sep 28 08:37:38 2022 ] 	Top5: 97.99%
[ Wed Sep 28 08:37:38 2022 ] Training epoch: 106
[ Wed Sep 28 08:40:45 2022 ] 	Mean training loss: 0.0959. loss2: 0.0000. Mean training acc: 97.49%.
[ Wed Sep 28 08:40:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:40:45 2022 ] Eval epoch: 106
[ Wed Sep 28 08:41:14 2022 ] 	Mean test loss of 258 batches: 0.4348307376395362.
[ Wed Sep 28 08:41:15 2022 ] 	Top1: 87.67%
[ Wed Sep 28 08:41:15 2022 ] 	Top5: 97.94%
[ Wed Sep 28 08:41:15 2022 ] Training epoch: 107
[ Wed Sep 28 08:44:22 2022 ] 	Mean training loss: 0.0949. loss2: 0.0000. Mean training acc: 97.48%.
[ Wed Sep 28 08:44:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:44:22 2022 ] Eval epoch: 107
[ Wed Sep 28 08:44:51 2022 ] 	Mean test loss of 258 batches: 0.4247793011627225.
[ Wed Sep 28 08:44:51 2022 ] 	Top1: 87.85%
[ Wed Sep 28 08:44:51 2022 ] 	Top5: 97.97%
[ Wed Sep 28 08:44:51 2022 ] Training epoch: 108
[ Wed Sep 28 08:47:59 2022 ] 	Mean training loss: 0.0924. loss2: 0.0000. Mean training acc: 97.54%.
[ Wed Sep 28 08:47:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:47:59 2022 ] Eval epoch: 108
[ Wed Sep 28 08:48:28 2022 ] 	Mean test loss of 258 batches: 0.4225018629732058.
[ Wed Sep 28 08:48:28 2022 ] 	Top1: 88.05%
[ Wed Sep 28 08:48:28 2022 ] 	Top5: 98.00%
[ Wed Sep 28 08:48:28 2022 ] Training epoch: 109
[ Wed Sep 28 08:51:36 2022 ] 	Mean training loss: 0.0893. loss2: 0.0000. Mean training acc: 97.67%.
[ Wed Sep 28 08:51:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:51:36 2022 ] Eval epoch: 109
[ Wed Sep 28 08:52:05 2022 ] 	Mean test loss of 258 batches: 0.4356887838117374.
[ Wed Sep 28 08:52:05 2022 ] 	Top1: 87.70%
[ Wed Sep 28 08:52:05 2022 ] 	Top5: 97.90%
[ Wed Sep 28 08:52:05 2022 ] Training epoch: 110
[ Wed Sep 28 08:55:12 2022 ] 	Mean training loss: 0.0852. loss2: 0.0000. Mean training acc: 97.80%.
[ Wed Sep 28 08:55:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:55:12 2022 ] Eval epoch: 110
[ Wed Sep 28 08:55:40 2022 ] 	Mean test loss of 258 batches: 0.4220333276099937.
[ Wed Sep 28 08:55:41 2022 ] 	Top1: 88.06%
[ Wed Sep 28 08:55:41 2022 ] 	Top5: 97.98%
[ Wed Sep 28 08:56:10 2022 ] Best accuracy: 0.8807545338751743
[ Wed Sep 28 08:56:10 2022 ] Epoch number: 104
[ Wed Sep 28 08:56:10 2022 ] Model name: work_dir/ntu60/csub/fc_bone_vel
[ Wed Sep 28 08:56:10 2022 ] Model total number of params: 2082097
[ Wed Sep 28 08:56:10 2022 ] Weight decay: 0.0004
[ Wed Sep 28 08:56:10 2022 ] Base LR: 0.1
[ Wed Sep 28 08:56:10 2022 ] Batch Size: 64
[ Wed Sep 28 08:56:10 2022 ] Test Batch Size: 64
[ Wed Sep 28 08:56:10 2022 ] seed: 1
